Boosting Learning Algorithm for Pattern Recognition and Beyond

Authors

  • Osamu Komori
  • Shinto Eguchi
Abstract

This paper discusses recent developments in pattern recognition, focusing on the boosting approach in machine learning. Statistical properties such as Bayes risk consistency for several loss functions are discussed in a probabilistic framework. A number of loss functions have been proposed for different purposes and targets. A unified derivation is given by a generator function U, which naturally defines an entropy, a divergence, and a loss function. The class of U-loss functions is associated with boosting learning algorithms for loss minimization, including AdaBoost and LogitBoost as a twin generated from the Kullback-Leibler divergence, as well as losses for the (partial) area under the ROC curve. We extend boosting to unsupervised learning, typically density estimation employing the U-loss function. Finally, a future perspective on machine learning is discussed.

Key words: AUC; boosting; entropy; divergence; ROC; U-loss function; density estimation.
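The abstract's central example of loss-minimizing boosting is AdaBoost, which greedily minimizes the exponential loss by reweighting misclassified examples each round. The following is a minimal sketch of that idea with axis-aligned decision stumps as weak learners; it assumes NumPy and is an illustrative implementation, not the paper's U-loss formulation.

```python
import numpy as np

def adaboost_train(X, y, n_rounds=20):
    """Train AdaBoost with decision stumps; labels y must be in {-1, +1}.

    Each round fits the stump with the lowest weighted error, then
    reweights examples by exp(-alpha * y * prediction) — the exponential
    loss update that drives AdaBoost.
    """
    n, d = X.shape
    w = np.full(n, 1.0 / n)            # example weights, start uniform
    stumps = []                        # (feature, threshold, polarity, alpha)
    for _ in range(n_rounds):
        best, best_err = None, np.inf
        # exhaustive search over axis-aligned stumps
        for j in range(d):
            for thr in np.unique(X[:, j]):
                for pol in (1, -1):
                    pred = pol * np.where(X[:, j] > thr, 1, -1)
                    err = np.sum(w[pred != y])
                    if err < best_err:
                        best_err, best = err, (j, thr, pol)
        eps = max(best_err, 1e-12)                 # guard against log(0)
        alpha = 0.5 * np.log((1 - eps) / eps)      # weight of this stump
        j, thr, pol = best
        pred = pol * np.where(X[:, j] > thr, 1, -1)
        w *= np.exp(-alpha * y * pred)             # exponential-loss reweighting
        w /= w.sum()
        stumps.append((j, thr, pol, alpha))
    return stumps

def adaboost_predict(stumps, X):
    """Sign of the weighted vote over all trained stumps."""
    score = np.zeros(X.shape[0])
    for j, thr, pol, alpha in stumps:
        score += alpha * pol * np.where(X[:, j] > thr, 1, -1)
    return np.sign(score)
```

Replacing the exponential loss in the reweighting step with the logistic loss yields LogitBoost's update, which is the sense in which the abstract calls the two a "twin" generated from the Kullback-Leibler divergence.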


Similar Articles

Combining Bagging and Boosting

Bagging and boosting are among the most popular resampling ensemble methods; they generate and combine a diverse set of classifiers using the same learning algorithm for the base classifiers. Boosting algorithms are considered stronger than bagging on noise-free data. However, there are strong empirical indications that bagging is much more robust than boosting in noisy settings. For this reason, i...


Object recognition using boosted adaptive features

Most existing pattern recognition techniques classify data samples using a fixed set of features, either hand-crafted or learned by unsupervised methods. In natural and uncontrolled environments, however, more adaptive classifiers can be useful. We propose a learning algorithm based on a boosting scheme in which features are adapted to the classification task, result...


Boosted Image Classification: An Empirical Study

The rapid pace of research in the fields of machine learning and image comparison has produced powerful new techniques in both areas. At the same time, research has been sparse on applying the best ideas from both fields to image classification and other forms of pattern recognition. This paper combines boosting with state-of-the-art methods in image comparison to carry out a comparative evaluat...


Local Parametric Modeling via U-Divergence

This paper discusses local parametric modeling using U-divergence in statistical pattern recognition. The class of U-divergence measures admits an empirical loss function in a simple common form, and includes the Kullback-Leibler divergence, the power divergence, and mean squared error. We propose a minimization algorithm for parametric models of sequentially increasing dimension by incorpora...


Al-Alaoui Pattern Recognition Algorithm: A MSE Asymptotic Bayesian Approach to Boosting

The relation of the Al-Alaoui pattern recognition algorithm to the boosting and bagging approaches to pattern recognition is delineated. It is shown that the Al-Alaoui algorithm shares with bagging and boosting the concepts of replicating and weighting instances of the training set. Additionally, it is shown that the Al-Alaoui algorithm provides a Mean Square Error (MSE) asymptotic Bayesian appr...



Journal:
  • IEICE Transactions

Volume 94-D, Issue -

Pages -

Publication year: 2011